
    Beyond Sparsity: Tree Regularization of Deep Models for Interpretability

    The lack of interpretability remains a key barrier to the adoption of deep models in many applications. In this work, we explicitly regularize deep models so human users might step through the process behind their predictions in little time. Specifically, we train deep time-series models so their class-probability predictions have high accuracy while being closely modeled by decision trees with few nodes. Using intuitive toy examples as well as medical tasks for treating sepsis and HIV, we demonstrate that this new tree regularization yields models that are easier for humans to simulate than simpler L1 or L2 penalties, without sacrificing predictive power.
    Comment: To appear in AAAI 2018. Contains 9-page main paper and appendix with supplementary material.
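    As a rough illustration of the idea (a minimal sketch, not the authors' implementation: `model_predict`, `X`, and the use of scikit-learn's DecisionTreeClassifier are assumptions), the tree-regularization penalty can be read as the average decision-path length of a small tree fit to mimic the deep model's predictions:

```python
# Sketch: estimate a tree-regularization-style penalty for a trained model by
# fitting a small decision tree to its predictions and measuring the average
# root-to-leaf path length. `model_predict` and `X` are hypothetical.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def average_path_length(model_predict, X, max_leaf_nodes=8):
    """Fit a small tree to mimic the model; return mean decision-path length."""
    y_hat = (model_predict(X) > 0.5).astype(int)      # binarize class probabilities
    tree = DecisionTreeClassifier(max_leaf_nodes=max_leaf_nodes).fit(X, y_hat)
    path_lengths = tree.decision_path(X).sum(axis=1)  # nodes visited per sample
    return float(np.mean(path_lengths))

# Training would then minimize: cross_entropy + lambda_ * average_path_length(...)
```

    In the paper this penalty is made differentiable by training a surrogate network to predict the path length, so it can be optimized jointly with the usual training loss.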

    Multivariate Differential Association Analysis

    Identifying how dependence relationships vary across different conditions plays a significant role in many scientific investigations. For example, when comparing biological systems it is important to see whether relationships between genomic features differ between cases and controls. In this paper, we evaluate whether the relationships between two sets of variables differ across two conditions. Specifically, we ask: do two sets of high-dimensional variables have similar dependence relationships under two conditions? We propose a new kernel-based test to capture this differential dependence: the test determines whether two measures of dependence are similar or not under the two conditions. We derive the asymptotic permutation null distribution of the test statistic and show that it works well for finite samples, which makes the test computationally efficient and easily applicable to large data sets. We demonstrate through numerical studies that the proposed test has high power for detecting differential linear and non-linear relationships. The proposed method is implemented in the R package kerDAA.
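    A minimal sketch of the general idea, assuming HSIC as the kernel dependence measure and a label-permutation null; the paper's actual statistic and its asymptotic null are implemented in kerDAA, and the function names below are illustrative only:

```python
# Sketch: compare a kernel dependence measure (HSIC) between two sets of
# variables under two conditions, with a permutation null over condition labels.
import numpy as np

def rbf_gram(Z, sigma=1.0):
    """RBF (Gaussian) kernel Gram matrix."""
    sq = np.sum(Z**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * Z @ Z.T
    return np.exp(-d2 / (2.0 * sigma**2))

def hsic(X, Y, sigma=1.0):
    """Biased HSIC estimate of the dependence between X and Y."""
    n = X.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n               # centering matrix
    return np.trace(H @ rbf_gram(X, sigma) @ H @ rbf_gram(Y, sigma)) / (n - 1) ** 2

def differential_dependence_test(X, Y, cond, n_perm=1000, seed=0):
    """Test whether dependence(X, Y) differs between cond == 0 and cond == 1."""
    rng = np.random.default_rng(seed)
    stat = lambda c: abs(hsic(X[c == 0], Y[c == 0]) - hsic(X[c == 1], Y[c == 1]))
    observed = stat(cond)
    perms = [stat(rng.permutation(cond)) for _ in range(n_perm)]
    p_value = float(np.mean([p >= observed for p in perms]))
    return observed, p_value
```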

    Dynamics of Neural Networks with Continuous Attractors

    We investigate the dynamics of continuous attractor neural networks (CANNs). Due to the translational invariance of their neuronal interactions, CANNs can hold a continuous family of stationary states. We systematically explore how their neutral stability facilitates the tracking performance of a CANN, which is believed to have wide applications in brain functions. We develop a perturbative approach that utilizes the dominant movement of the network stationary states in the state space. We quantify the distortions of the bump shape during tracking and study their effects on the tracking performance. Results are obtained on the maximum speed at which a moving stimulus remains trackable and on the reaction time to catch up with an abrupt change in the stimulus.
    Comment: 6 pages, 7 figures with 4 captions
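    To make the setting concrete, here is a minimal 1D CANN simulation sketch with assumed parameter values and a simple ring geometry (not the paper's model or code): a translation-invariant Gaussian recurrent kernel plus divisive global inhibition lets an activity bump track a slowly moving stimulus.

```python
# Sketch: a bump of activity on a ring of N neurons tracking a drifting
# stimulus. All parameter values are assumptions made for illustration.
import numpy as np

N, a, tau, dt = 200, 0.5, 1.0, 0.05
x = np.linspace(-np.pi, np.pi, N, endpoint=False)
dx = x[1] - x[0]
diff = (x[:, None] - x[None, :] + np.pi) % (2 * np.pi) - np.pi
J = np.exp(-diff**2 / (2 * a**2)) / (np.sqrt(2 * np.pi) * a)   # translation-invariant kernel

def step(u, stim_pos, A=1.0, k=0.5):
    r = np.maximum(u, 0.0) ** 2
    r = r / (1.0 + k * dx * r.sum())                  # divisive global inhibition
    I_ext = A * np.exp(-((x - stim_pos) ** 2) / (4 * a**2))
    du = (-u + dx * J @ r + I_ext) / tau
    return u + dt * du

u = np.exp(-x**2 / (2 * a**2))                        # initial bump at position 0
for t in range(2000):
    u = step(u, stim_pos=0.0005 * t)                  # stimulus drifts slowly
print("decoded bump position:", x[np.argmax(u)])
```

    Increasing the drift speed in such a simulation eventually makes the bump lose the stimulus, which is the kind of maximum trackable speed the abstract refers to.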

    Dynamical Synapses Enhance Neural Information Processing: Gracefulness, Accuracy and Mobility

    Experimental data have revealed that neuronal connection efficacy exhibits two forms of short-term plasticity: short-term depression (STD) and short-term facilitation (STF). Their time constants reside between those of fast neural signaling and rapid learning, and they may serve as substrates by which neural systems manipulate temporal information on the relevant time scales. The present study investigates the impact of STD and STF on the dynamics of continuous attractor neural networks (CANNs) and their potential roles in neural information processing. We find that STD endows the network with slow-decaying plateau behaviors: a network that is initially stimulated to an active state decays to silence very slowly, on the time scale of STD rather than that of neural signaling. This provides a mechanism for neural systems to hold sensory memory easily and to shut off persistent activity gracefully. With STF, the network can hold a memory trace of external inputs in the facilitated neuronal interactions, which stabilizes the network response to noisy inputs and improves the accuracy of population decoding. Furthermore, we find that STD increases the mobility of the network states; this enhances the tracking performance of the network in response to time-varying stimuli, leading to anticipative neural responses. In general, STD and STF tend to have opposite effects on network dynamics and complementary computational advantages, suggesting that the brain may weight them differently depending on the computational purpose.
    Comment: 40 pages, 17 figures
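    For readers unfamiliar with the two mechanisms, the sketch below uses a standard rate-based (Tsodyks-Markram-style) formulation of STD and STF for a single synapse; the parameter values and function name are assumptions, not taken from the paper.

```python
# Sketch of rate-based short-term plasticity: x tracks available synaptic
# resources (depression, STD), u tracks release probability (facilitation, STF);
# the effective synaptic efficacy is proportional to u * x.
import numpy as np

def simulate_stp(rate, T=2.0, dt=1e-3, tau_d=0.2, tau_f=0.6, U=0.2):
    """Drive a synapse at a constant presynaptic rate (Hz); return u*x over time."""
    steps = int(T / dt)
    x, u = 1.0, U
    efficacy = np.empty(steps)
    for i in range(steps):
        du = (U - u) / tau_f + U * (1.0 - u) * rate   # facilitation builds with activity
        dx = (1.0 - x) / tau_d - u * x * rate         # resources deplete with activity
        u += dt * du
        x += dt * dx
        efficacy[i] = u * x                            # effective synaptic strength
    return efficacy
```

    In a CANN these variables would modulate the recurrent couplings neuron by neuron: depression-dominated synapses make the bump more mobile, while facilitation leaves a trace of recent activity, matching the roles described above.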

    A Moving Bump in a Continuous Manifold: A Comprehensive Study of the Tracking Dynamics of Continuous Attractor Neural Networks

    Understanding how the dynamics of a neural network is shaped by the network structure, and consequently how the structure facilitates the functions implemented by the neural system, is at the core of using mathematical models to elucidate brain functions. This study investigates the tracking dynamics of continuous attractor neural networks (CANNs). Due to the translational invariance of their neuronal recurrent interactions, CANNs can hold a continuous family of stationary states, which form a continuous manifold on which the neural system is neutrally stable. We systematically explore how this property facilitates the tracking performance of a CANN, which is believed to have clear correspondence with brain functions. Using the wave functions of the quantum harmonic oscillator as the basis, we show how the dynamics of a CANN decomposes into different motion modes, corresponding to distortions in the amplitude, position, width or skewness of the network state. We then develop a perturbative approach that utilizes the dominant movement of the network's stationary states in the state space. This method allows us to approximate the network dynamics to arbitrary accuracy, depending on the order of the perturbation used. We quantify the distortions of a Gaussian bump during tracking and study their effects on the tracking performance. Results are obtained on the maximum speed at which a moving stimulus remains trackable and on the reaction time for the network to catch up with an abrupt change in the stimulus.
    Comment: 43 pages, 10 figures
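    A sketch of the projection step (the width parameter and scaling are assumptions, not taken from the paper): deviations of the bump from its Gaussian steady state are projected onto quantum-harmonic-oscillator wave functions, whose lowest modes correspond to amplitude, position, width and skewness distortions.

```python
# Sketch: decompose the deviation of a network state from its steady-state
# Gaussian bump into harmonic-oscillator (Hermite-function) modes.
import numpy as np
from math import factorial, pi, sqrt
from scipy.special import hermite

def qho_mode(n, x, a=0.5):
    """n-th harmonic-oscillator wave function with width parameter a."""
    xi = x / a
    norm = 1.0 / sqrt(2**n * factorial(n) * a * sqrt(pi))
    return norm * hermite(n)(xi) * np.exp(-xi**2 / 2)

def decompose(bump, steady_bump, x, n_modes=4, a=0.5):
    """Project (bump - steady_bump) onto the first n_modes wave functions."""
    dx = x[1] - x[0]
    dev = bump - steady_bump
    return np.array([dx * np.sum(dev * qho_mode(n, x, a)) for n in range(n_modes)])
```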